Seconde Nature Second Life

Last changed: 2009/02/07 23:39 

 

Locus Sonus at Seconde Nature

Alejo Duque, Scott Fitzgerald

The first notes on the project are here: Seconde Nature

ADD YOUR AFTER THOUGHTS RELATED TO THE SECOND LIFE PROJECT PRESENTED AT THE SECOND NATURE FESTIVAL

http://nujus.net/%7Elocusonus/dropbox/alejo/SNLS/building_SL_process.jpg

http://nujus.net/%7Elocusonus/dropbox/alejo/SNLS/SL_LS_domo_verde.jpg

http://nujus.net/%7Elocusonus/dropbox/alejo/SNLS/etranger.jpg

SL Tracking

Background

The process required to get the position of an object from Second Life into a Pure Data patch is fairly straightforward. There are some “gotchas” outlined below. These primarily lie in the Second Life domain.

Second Life has its own scripting language, with the uninspired name of Linden Scripting Language (LSL). The scripts can be attached to any in-world object you have appropriate permissions for, including your body.

LSL allows HTTP requests to be made from inside the world to connect to external content (web pages, audio, video). By making a request to a web server running PHP, we can send the x, y, z coordinates of the object we are interested in. In this instance we also send the object's rotation around the z-axis, in radians.

When the request is made, the PHP script parses the information and opens a UDP socket on the local machine. A Pure Data patch listens on that socket for incoming information, which is routed according to the object's name, giving us the position of each object inside the virtual space.
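The receiving end can be sketched outside of Pure Data as well. The Python sketch below is purely illustrative (it was not part of the project); it assumes the space-separated "type name x y z rot" message format that the PHP script produces, and listens on the same UDP port the Pd patch uses:

```python
import socket

def parse_message(msg):
    """Split a 'type name x y z rot' datagram into typed fields."""
    parts = msg.strip().split()
    obj_type, name = parts[0], parts[1]
    x, y, z, rot = (float(v) for v in parts[2:6])
    return obj_type, name, (x, y, z), rot

def listen(port=13001):
    """Receive datagrams on the Pd patch's port and print each object's position."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        data, _ = sock.recvfrom(1024)
        obj_type, name, pos, rot = parse_message(data.decode())
        print(obj_type, name, pos, rot)
```

Routing by name, as the `route` objects do in the Pd patch, would then be a dictionary lookup on the parsed `name` field.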

Implementation

There were several different iterations of the LSL script, for various reasons. LSL throttles HTTP requests, limiting them to roughly one per second per object. If the requesting script is attached to a large object (about the size of an average avatar), the object loses "energy" over time and cannot continue to make requests until that "energy" has replenished (which happens only while it is not making requests).

Another issue we encountered is that objects will intermittently report bogus location data, claiming to be somewhere other than where they actually are. This happened irregularly, usually starting about an hour after an object carrying the script was "rezzed".
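One way to guard against such bogus readings (we did not implement this at the time; the sketch below is purely illustrative, and the 5-metre threshold is an assumption to be tuned to the space) is to reject any report that jumps implausibly far from the previous accepted one:

```python
MAX_JUMP = 5.0  # metres per report; assumed threshold, tune to the space

def accept_position(prev, new, max_jump=MAX_JUMP):
    """Accept a reading only if it is plausibly close to the previous one."""
    if prev is None:  # first report: nothing to compare against
        return True
    dist = sum((a - b) ** 2 for a, b in zip(prev, new)) ** 0.5
    return dist <= max_jump
```

The trade-off is that a genuine fast move would also be rejected, so the threshold has to sit above the fastest plausible drag speed.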

The original script we were using only reported an object's beginning and end positions; no updates in between were registered. This proved problematic, as an object could move from one side of the space to the other, sounding as if the move had happened instantaneously.

The second iteration had the objects constantly updating their location, whether they were being moved or not. This ran into the problem mentioned above: they would erroneously report their location, claiming to be somewhere else (often what appeared to be a previous location).

A compromise was reached, whereby the object reported its location every second while it was being touched. Random location data would still sneak in while a participant was moving something, but it was much less frequent and caused no problems once the object was positioned. One unfortunate side effect of this is that if an object's position changed through collision with another object or avatar, or if momentum carried it away from the place it was last touched, its reported location was no longer accurate.

One other bit that needed to be added to the LSL was a boundary checker. We did not want the objects to leave the confines of the Locus Sonus parcel of land, but there was ample opportunity for people to move them out of our space. This was alleviated through a small script that constantly checked whether the object was within our land's coordinates. If it was not, it was turned "phantom" (to allow it to pass through walls) and moved to the silent zone; once it reached that resting place, it was returned to its normal "physical" state.

Code

1) LSL script


// handle for the pending HTTP request
key http_request_id;
// the URL of the web server & PHP script
string base_url = "http://213.56.149.52/put_data.php";
// variables
vector my_pos = <0,0,0>;
float timetowait = 1.;

default
{
    state_entry()
    // when the object first appears
    {
        // start a timer that goes off every second
        llSetTimerEvent(timetowait);
    }

    timer()
    // when the timer goes off
    {
        // get the object's location
        my_pos = llGetPos();
        // get x/y coordinates
        float x_pos = my_pos.x;
        float y_pos = my_pos.y;
        // check whether the object is inside the Locus space
        if (y_pos < 159 || y_pos > 208 || x_pos > 99 || x_pos < 60)
        {
            // if it is not, make it phantom and move it to the silent place
            llSetStatus(STATUS_PHANTOM, TRUE);
            llMoveToTarget(<96.943,202.607,22.441>, .2);
        }
        else
        {
            // if it is inside the Locus space,
            // make it physical again and stop moving it
            llSetStatus(STATUS_PHANTOM, FALSE);
            llStopMoveToTarget();
        }
    }

    touch(integer num_detected)
    {
        // the object is being touched

        // get rotation around the z-axis
        float rot = llRot2Angle(llGetRot());
        // get the object's name
        string name = llKey2Name(llGetKey());
        // pack the name, location, and rotation into a string
        string request = "?type=avatar&name=" + name + "&data=" + (string)my_pos + "%20" + (string)rot;

        // make the HTTP request
        http_request_id = llHTTPRequest(base_url + request, [], "");

        // wait for one second
        llSleep(1.);
    }
}

2) PHP script

<?php

/*
Script for getting object type, name and data via HTTP
and sending them to a UDP socket
- Scott Fitzgerald, Apr. '08, based on
- Robb Drinkwater, Aug. '07, Jan. '08
*/

$type = $_REQUEST['type'];     // get object type
$obj_name = $_REQUEST['name']; // get object name
$pd_data = $_REQUEST['data'];  // get object data

echo "<h2>UDP Connection</h2>\n";

/* Get the IP address for the target host. */
$address = gethostbyname('localhost');

/* Create a UDP/IP socket. */
$socket = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
if ($socket === false) {
    echo "socket_create() failed: reason: " . socket_strerror(socket_last_error()) . "\n";
} else {
    echo "OK.\n";
}

// open a port
echo "Attempting to connect to '$address' on port '13001'...";
$result = socket_connect($socket, $address, 13001);
if ($result === false) {
    echo "socket_connect() failed.\nReason: ($result) " . socket_strerror(socket_last_error($socket)) . "\n";
} else {
    echo "OK.\n";
}

/* Catch data in Second Life vector format and reformat it */
if (strstr($pd_data, '<') == true) {
    //print "found vector format";
    $as_list = str_replace(array("<", ">"), "", $pd_data); // strip < and >
    $as_list = str_replace(",", " ", $as_list);            // replace commas with spaces
    $pd_data = $as_list;
}

// formatted as a raw 'type', 'name', 'data' list
$formatted = $type . " " . $obj_name . " " . $pd_data . "\n";

// if the formatted string length is greater than 1 (i.e. we assume it got data),
// send the data over the socket
if (strlen($formatted) > 1) {
    socket_send($socket, $formatted, strlen($formatted), MSG_DONTROUTE);
}

?>

3) Pd patch

#N canvas 0 22 450 300 10;
#X obj 67 67 netreceive 13001 1;
#X obj 67 92 route avatar;
#X obj 67 118 route Object1;
#X obj 67 146 unpack 0 0 0 0;
#X floatatom 67 177 5 0 0 0 - - -;
#X floatatom 94 197 5 0 0 0 - - -;
#X floatatom 121 217 5 0 0 0 - - -;
#X floatatom 148 172 5 0 0 0 - - -;
#X text 65 191 x;
#X text 91 215 y;
#X text 120 240 z;
#X text 145 194 rotation (in radians);
#X obj 277 124 expr ($f1*360.)/6.283185;
#X floatatom 277 157 5 0 0 0 - - -;
#X text 313 158 degrees;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 3 1 5 0;
#X connect 3 2 6 0;
#X connect 3 3 7 0;
#X connect 7 0 12 0;
#X connect 12 0 13 0;
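The expr object in the patch converts the rotation from radians to degrees. The same arithmetic in Python, purely for illustration, keeping the patch's 6.283185 approximation of 2π:

```python
import math

def rad_to_deg(rot):
    """Mirror of the Pd expr object: ($f1*360.)/6.283185."""
    return (rot * 360.0) / 6.283185

# a half turn (pi radians) comes out as roughly 180 degrees
half_turn = rad_to_deg(math.pi)
```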

Conclusions

I think it would be best for Locus Sonus to consider moving to a different platform, or tool, for creating aural networked virtual spaces.

Second Life has one thing going for it: a built-in user base. However, we did not capitalize on that base, for a number of reasons. With an estimated population density of 87 people per square kilometre, one has to wonder what Second Life really offers us in terms of interaction with people from around the world.

A last-minute email sent to the Locus Sonus mailing list, and no "in world" publicity, accounted for (from what I saw) two visitors in Second Life who visited the space during the Seconde Nature festival without being physically present at the event.

If the population of Second Life is 1) not informed about LS and the work and 2) nowhere near the event at the time, then it defeats the purpose of using such a space, as the pre-existing community offers us nothing.

The participants at Seconde Nature certainly enjoyed themselves, but it is hard to say how many of them thought of it as a "game" or a "Second Life thing" rather than as a sound work. Wrestling with cumbersome controls and using machines that were not designed to run the simulator did not help people experience what the work should have offered.

Also, as we witnessed, the Second Life scripting language has a large number of flaws that seriously inhibit accurate position tracking, making the environment far from ideal for what we wished to achieve.

Drawing from the above, we can conclude that Second Life failed us for the following reasons:

1) Poor UI: the controls are awful, and people in the space who had never used SL before did not know what they were doing (even the SL team, having used it for several weeks prior, was still unable to control avatars and objects precisely).

2) Unreliable information: the Second Life scripting language sends bogus data and is generally prohibitive for this type of communication.

3) Lack of community: the one asset Second Life does have, a large user base, is offset by the facts that a) there was no "in world" advertising of the event and b) population density in the simulator is so low that a "walk-in" is unlikely.

4) Misunderstanding of the intent: people perceived the whole experience as a "game", and their take-away was more about that aspect, "the game Second Life", than about a sound piece, which detracts from the intent.

Having said all that, I think the ideas we worked with (virtualized sound spatialization / linking the virtual and the real) are valid points of investigation. Perhaps LS could explore other virtual spaces to work with. Panda3D http://panda3d.org/ , as mentioned by Alejo and used by SAIC, can run over the network and allows many people to log in remotely; apparently it has a sound synthesis engine as well. Ogre http://www.ogre3d.org/ is another open-source 3D engine, though I am not sure of its ability to be networked (an interesting related project, http://jitogre.org , exposes Ogre to Max/MSP/Jitter).

Of course, there is also the possibility of creating a 3D environment in GEM and streaming it to clients, for another approach. Obviously it would not have all the functionality of an Ogre or a Panda, but it could serve as a simple sandbox.

http://nujus.net/%7Elocusonus/dropbox/alejo/SNLS/logo_locus_sonus_sl.jpg